Considering the time delay and the terminal energy consumption caused by high-speed data transmission and computation, an uplink transmission scheme with equal power allocation was proposed. Firstly, based on the collaborative properties of Augmented Reality (AR) services, a system model reflecting AR characteristics was established. Secondly, the system frame structure was analyzed in detail, and the constraints for minimizing the total energy consumption of the system were established. Finally, subject to the time delay and energy consumption constraints, a mathematical model of Mobile Edge Computing (MEC) resource optimization based on convex optimization was established to obtain an optimal communication and computing resource allocation scheme. Compared with the user-independent transmission scheme, the proposed scheme reduced the total energy consumption by 14.6% under maximum time delays of both 0.1 s and 0.15 s. The simulation results show that, under the same conditions, the equal-power MEC optimization scheme considering cooperative transmission between users can significantly reduce the total energy consumption of the system compared with the optimization scheme based on user-independent transmission.
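The abstract does not give the underlying energy model, but the trade-off it optimizes can be illustrated with a generic Shannon-rate uplink model: each user's transmission energy is its power times the time needed to send its data, and equal power allocation splits a total power budget evenly. All parameter values below (bits, channel gains, noise power) are hypothetical, not taken from the paper.

```python
import math

def uplink_energy(data_bits, bandwidth_hz, power_w, channel_gain, noise_w):
    """Transmission energy for one user: power x (bits / Shannon rate)."""
    rate = bandwidth_hz * math.log2(1.0 + power_w * channel_gain / noise_w)
    return power_w * data_bits / rate

def total_energy_equal_power(users, total_power_w, bandwidth_hz):
    """Equal power allocation: every user transmits with the same power."""
    p = total_power_w / len(users)
    return sum(uplink_energy(u["bits"], bandwidth_hz, p, u["gain"], u["noise"])
               for u in users)

# Two hypothetical AR users uploading 1 Mbit and 2 Mbit of sensor data.
users = [
    {"bits": 1e6, "gain": 1e-6, "noise": 1e-9},
    {"bits": 2e6, "gain": 5e-7, "noise": 1e-9},
]
e = total_energy_equal_power(users, total_power_w=0.2, bandwidth_hz=1e6)
```

In the paper the allocation is found by convex optimization under delay constraints; this sketch only evaluates the objective for a fixed equal-power choice.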
The breakdown of simulation nodes or a shortage of simulation resources causes failures in a distributed simulation system and reduces its reliability. To improve fault tolerance and decrease its overhead, a fault tolerance method based on virtualization technology was proposed, in which different fault tolerance strategies were adopted according to the failure location. The optimization of the checkpoint strategy was analyzed and the optimal checkpoint interval was derived. Three main problems of the backup strategy were identified, namely the selection of nodes, the number of copies and the distribution of the copies, and they were solved through virtualization technology. A fault tolerance strategy based on virtual machine migration was proposed as a complement to the checkpoint and backup strategies to further decrease the overhead. The performance of the dynamic fault tolerance strategy and the normal fault tolerance strategy was evaluated through experiments. The experimental results show that the proposed fault tolerance strategy is efficient and its overhead is kept at a low level.
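The abstract states that an optimal checkpoint interval was derived but not how; the classical first-order answer to this trade-off is Young's approximation, which balances checkpointing cost against expected rework after a failure. The paper's own derivation may differ; the numbers below are illustrative only.

```python
import math

def optimal_checkpoint_interval(checkpoint_cost_s, mtbf_s):
    """Young's first-order approximation of the checkpoint interval that
    minimizes expected overhead (checkpoint writes plus lost work)."""
    return math.sqrt(2.0 * checkpoint_cost_s * mtbf_s)

# Hypothetical node: 30 s to write a checkpoint, one failure per day on average.
interval = optimal_checkpoint_interval(checkpoint_cost_s=30.0, mtbf_s=24 * 3600.0)
# interval = sqrt(2 * 30 * 86400) ~ 2277 s, i.e. checkpoint roughly every 38 min
```

The formula makes the qualitative point in the abstract concrete: cheaper checkpoints (e.g. via virtual machine snapshots) shrink the optimal interval and the total overhead together.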
Since the sensitivity and specificity of current microRNA identification methods are unsatisfactory or imbalanced, because new features are emphasized while the weak classification ability and redundancy of features are ignored, an ensemble algorithm based on feature clustering and the random subspace method, named CLUSTER-RS, was proposed. After eliminating features with weak classification ability according to the information gain ratio, the algorithm used information entropy to measure feature relevance and grouped the features into clusters. It then randomly selected the same number of features from each cluster to compose a feature set, which was used to train the base classifiers constituting the final identification model. After optimizing the algorithm by tuning its parameter and selecting base classifiers, CLUSTER-RS was compared experimentally with five classic microRNA identification methods (Triplet-SVM, miPred, MiPred, microPred and HuntMi) on the latest microRNA dataset. CLUSTER-RS was inferior only to microPred in sensitivity, performed best in specificity, and also had an advantage in accuracy and Matthews correlation coefficient. The experiments show that CLUSTER-RS achieves good performance and is superior to its rivals in the balance between sensitivity and specificity.
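The core subspace-construction step of CLUSTER-RS can be sketched as follows: cluster related features, then draw the same number of features from each cluster to form the feature set of one base classifier. The greedy clustering and the similarity matrix below are simplifications standing in for the paper's entropy-based relevance measure; names and thresholds are hypothetical.

```python
import random

def cluster_features(similarity, threshold):
    """Greedily group feature indices: a feature joins the first cluster whose
    representative is similar enough, otherwise it starts a new cluster.
    `similarity[i][j]` stands in for an entropy-based relevance measure."""
    clusters = []
    for f in range(len(similarity)):
        for c in clusters:
            if similarity[f][c[0]] >= threshold:
                c.append(f)
                break
        else:
            clusters.append([f])
    return clusters

def random_subspace(clusters, per_cluster, rng):
    """Draw the same number of features from each cluster; the union is the
    feature set used to train one base classifier of the ensemble."""
    subset = []
    for c in clusters:
        subset.extend(rng.sample(c, min(per_cluster, len(c))))
    return sorted(subset)

# Toy similarity matrix for 4 features: 0 and 1 are related, 2 and 3 are related.
sim = [[1.0, 0.9, 0.1, 0.2],
       [0.9, 1.0, 0.2, 0.1],
       [0.1, 0.2, 1.0, 0.8],
       [0.2, 0.1, 0.8, 1.0]]
clusters = cluster_features(sim, threshold=0.5)   # -> [[0, 1], [2, 3]]
subset = random_subspace(clusters, per_cluster=1, rng=random.Random(0))
```

Sampling per cluster rather than over all features keeps each base classifier's feature set diverse while avoiding subsets made up of redundant, near-duplicate features.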
To address data quality problems in hydrological time series analysis and decision-making, a new prediction-based outlier detection algorithm was proposed. The method first split a given hydrological time series into subsequences to build a forecasting model for predicting future values; an outlier was then assumed to occur when the difference between the predicted and observed values exceeded a certain threshold. The setting of the sliding window and the parameters of the detection algorithm were analyzed, and the results were validated on real data. The experimental results show that the proposed algorithm can effectively detect outliers in time series, improving the sensitivity and specificity to at least 80% and 98% respectively.
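The predict-then-threshold scheme is simple enough to sketch end to end. A minimal version, assuming a moving-average forecaster in place of the paper's (unspecified) model and a hypothetical toy series:

```python
def detect_outliers(series, window, threshold):
    """Predict each point as the mean of the preceding `window` values and
    flag an outlier when |observed - predicted| exceeds `threshold`."""
    outliers = []
    for t in range(window, len(series)):
        predicted = sum(series[t - window:t]) / window
        if abs(series[t] - predicted) > threshold:
            outliers.append(t)
    return outliers

# Toy hourly water levels with one injected spike at index 6.
levels = [2.0, 2.1, 2.0, 2.2, 2.1, 2.0, 5.0, 2.1, 2.0, 2.2]
idx = detect_outliers(levels, window=3, threshold=1.5)  # -> [6]
```

The two knobs the abstract analyzes are visible here: a larger `window` smooths the prediction, and the `threshold` directly trades sensitivity against specificity.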
The time overhead of taint propagation analysis in off-line taint analysis is very large, so research on efficient taint propagation is of great significance. To solve this problem, an optimization method for taint propagation analysis based on semantic rules was proposed. The method defined semantic rules describing the taint propagation semantics of instructions, automatically generated the semantics of assembly instructions by means of an intermediate language, and then analyzed taint propagation according to the semantic rules, thereby avoiding the repeated semantic parsing caused by repeatedly executed instructions in existing taint analysis methods and improving the efficiency of taint analysis. The experimental results show that this method can effectively reduce the time cost of taint propagation analysis, costing only 14% of the time of taint analysis based on an intermediate language.
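The idea of precomputed per-instruction taint semantics can be illustrated with a tiny rule table: once a rule exists for an opcode, every occurrence of that instruction in a trace is handled by a table lookup instead of being re-parsed. The three rules below are hypothetical simplifications (the paper generates semantics automatically from an intermediate language and covers far more cases).

```python
# Hypothetical per-opcode taint rules: each maps (dst, srcs, taint set) to
# whether the destination is tainted after the instruction executes.
RULES = {
    "mov": lambda dst, srcs, taint: srcs[0] in taint,           # copy taint
    "add": lambda dst, srcs, taint: dst in taint or any(s in taint for s in srcs),
    "xor": lambda dst, srcs, taint: (False if srcs == [dst]     # xor r, r clears
                                     else dst in taint or srcs[0] in taint),
}

def propagate(trace, tainted):
    """Apply each instruction's semantic rule over an execution trace."""
    taint = set(tainted)
    for op, dst, srcs in trace:
        if RULES[op](dst, srcs, taint):
            taint.add(dst)
        else:
            taint.discard(dst)
    return taint

trace = [
    ("mov", "eax", ["input"]),   # eax <- tainted input
    ("add", "ebx", ["eax"]),     # ebx picks up taint from eax
    ("xor", "eax", ["eax"]),     # xor eax, eax clears eax
]
final = propagate(trace, tainted={"input"})   # -> {"input", "ebx"}
```

In a loop that executes the same `add` millions of times, the rule is parsed once and applied cheaply on every iteration, which is the source of the speedup the abstract reports.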